Mirantis Kubernetes Engine

Formerly Docker Enterprise

Overview

What is Mirantis Kubernetes Engine?

The Mirantis Kubernetes Engine (formerly Docker Enterprise, acquired by Mirantis in November 2019) aims to let users ship code faster. Mirantis Kubernetes Engine gives users one set of APIs and tools to deploy, manage, and observe secure-by-default, certified, batteries-included Kubernetes clusters on any infrastructure: public cloud, private cloud, or bare metal.

Recent Reviews

TrustRadius Insights

Docker has proven to be a versatile tool with a wide range of use cases. Users have found that Docker simplifies the packaging and …

Save space and time!

9 out of 10
March 15, 2021
Incentivized
Docker is heavily used to containerize projects and upload them to Kubernetes. It is helpful when developing microservices. Due to …

Productivity Booster

10 out of 10
August 14, 2019
Incentivized
Docker is used by most of our teams as part of their development and deployment practice. For development, it enables engineers to build …

Pricing


Free: $0.00 per year (Cloud)

Basic: $500.00 per year (Cloud)

Entry-level set up fee?

  • No setup fee
For the latest information on pricing, visit https://store.mirantis.com

Offerings

  • Free Trial
  • Free/Freemium Version
  • Premium Consulting/Integration Services

Starting price (does not include set up fee)

  • $500 per year per node

Product Details


Mirantis Kubernetes Engine Technical Details

Deployment Types: Software as a Service (SaaS), Cloud, or Web-Based
Operating Systems: Unspecified
Mobile Application: No

Frequently Asked Questions

The Mirantis Kubernetes Engine (formerly Docker Enterprise, acquired by Mirantis in November 2019) aims to let users ship code faster. Mirantis Kubernetes Engine gives users one set of APIs and tools to deploy, manage, and observe secure-by-default, certified, batteries-included Kubernetes clusters on any infrastructure: public cloud, private cloud, or bare metal.

Mirantis Kubernetes Engine starts at $500.

Reviewers rate Support Rating highest, with a score of 7.8.

The most common users of Mirantis Kubernetes Engine are from Enterprises (1,001+ employees).


Reviews and Ratings

(210)

Community Insights

TrustRadius Insights are summaries of user sentiment data from TrustRadius reviews and, when necessary, 3rd-party data sources.

Docker has proven to be a versatile tool with a wide range of use cases. Users have found that Docker simplifies the packaging and deployment of applications and services, allowing developers to match their development environment to production and eliminate cross-cutting software dependencies. It has been utilized as the backbone of a hosted app infrastructure, where every element is broken down into microservices deployed on the AWS cloud. Additionally, Docker has been instrumental in creating specialized microservices such as a Selenium Grid for automated web-based testing.

Moreover, Docker has played a crucial role in maintaining environmental consistency and streamlining deployment processes. It has enabled users to swiftly containerize Continuous Deployment and Integration pipelines, facilitating easy deployment and updates of the system and its environments. With Docker, users have been able to quickly deploy and monitor servers, firewalls, switches, and other components, providing a consistent and efficient environment for prototyping and testing. Another notable use case is spinning up new databases for microservices using Docker, ensuring consistency and independence across different environments.

Furthermore, Docker has integrated seamlessly with orchestration frameworks like Apache Mesos and Mesosphere Marathon. This combination has allowed for more efficient application development and deployment through effective management of containers. Docker has also demonstrated its utility in building server deployment files and running tests, enabling consistent deployments and reliable testing procedures.

In addition to these technical applications, Docker has proved to be valuable in hosting MySQL databases for production websites. Its stability, security features, and easy provisioning of identical instances have made it a preferred choice for users. Moreover, Docker has been extensively used in CI builds as it enables the creation of custom Linux images and seamless deployment of the latest code from the Docker registry.

The flexibility offered by Docker comes to the forefront when it comes to testing practices. It provides a highly configurable environment that makes cross-platform testing significantly more efficient. Users have leveraged Docker for both automated website/application testing pipelines as well as creating flexible environments for manual testing. Moreover, Docker has acted as a viable alternative to custom build and deploy solutions, offering a more flexible and decentralized process.

Notably, Docker has been embraced by a large global financial services provider to enhance efficiency and agility in application development. This adoption has resulted in increased innovation and productivity within the organization. Another significant benefit of using Docker is its ability to provide identical application environments across multiple deployment environments, leading to the deployment of more stable applications.

Furthermore, Docker has played a role in differentiating between server/compute infrastructure and application infrastructure. Operations teams can efficiently manage the cluster of servers, while application developers can run containers on the cluster, ensuring a clear separation and easier management of the two layers.

Teams have leveraged Docker for various development and deployment practices. Engineers can build applications in the same environment, eliminating local configuration issues that often arise when working across different setups. Docker has been particularly useful for WordPress development, replacing tools like Vagrant and providing tighter integration with Windows Hyper-V and better performance.

One of the significant advantages of Docker is its ability to containerize applications, resulting in consistent deployment environments across different stages and compatibility with various cloud platforms. This has greatly simplified the deployment process for users and enhanced their productivity. Additionally, Docker has been highly beneficial for the development team in resolving issues related to different setups on Windows, Linux, and Mac operating systems, while also providing easy configurations for automation QA.

Docker's impact extends beyond software development into the realm of research reproducibility. Users have developed Docker containers to encapsulate research pipelines, leveraging GitHub and DockerHub as public repositories. This approach has effectively addressed the challenge of ensuring reproducibility in research experiments.

Moreover, Docker Swarm has been employed to deploy internal applications in a managed cluster, successfully tackling scaling and load balancing issues during peak business hours. The combination of Docker with Kubernetes has also gained popularity among teams for containerizing projects and facilitating the development of microservices.

Overall, Docker's value proposition lies in its ability to provide consistent development environments, prevent deployment issues, streamline configurations, enhance testing efficiency, and simplify the overall software packaging and deployment processes. Its widespread usage across various industries highlights its robustness, ease of setup, community support through open-sourced images, and its ability to create and test configurations as needed. Docker has become an indispensable tool for many organizations seeking to optimize their software development lifecycle while improving productivity and innovation.

Based on the reviews, here are the three most common recommendations:

  1. Users recommend trying Docker for deploying web services and running micro-services. They suggest doing tutorials to learn how to create Dockerfiles and docker-compose files correctly (a minimal sketch follows the summary below). Additionally, they advise considering whether Docker is necessary or if statically linked binaries can be used instead.

  2. Users also recommend using Docker for QA environments and setting up developers with the environment they need. They find Docker to be an easy-to-use development tool with great rewards for a small amount of effort. However, some users caution that while Docker is a good solution, there may be better alternatives available.

  3. Another common recommendation is to carefully consider the use of Docker in a workflow and discuss its usability within the organization before implementing it. Users emphasize the importance of learning the basics of Docker and understanding if continuous integration/deployment is the right approach. They also mention that Docker has a supportive community and is widely used in the industry.

Overall, users suggest experimenting with Docker, especially for new applications or running micro-services. They recommend taking advantage of Docker's simplicity and portability while being mindful of specific requirements and considering other options if needed.
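
As a concrete starting point for the tutorials users recommend, the sketch below shows a minimal Dockerfile and docker-compose.yml for a hypothetical Node.js web service, plus the command to run them. The base image, port 3000, and server.js entry point are illustrative assumptions, not taken from any review.

    # Dockerfile (minimal sketch)
    # Small official base image
    FROM node:18-alpine
    WORKDIR /app
    COPY package*.json ./
    # Install only production dependencies
    RUN npm ci --omit=dev
    COPY . .
    EXPOSE 3000
    # One process per container
    CMD ["node", "server.js"]

    # docker-compose.yml (minimal sketch)
    services:
      web:
        build: .
        ports:
          - "3000:3000"

    # Build the image and start the service locally
    docker compose up --build

Working through a tutorial with a file pair like this is usually enough to see how images, containers, and port mappings fit together before deciding whether Docker is the right fit.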

Attribute Ratings

Reviews

(1-22 of 22)
Companies can't remove reviews or game the system.
Anuj Rai | TrustRadius Reviewer
Score 6 out of 10
Vetted Review
Verified User
Incentivized
Docker Enterprise is quite a handy solution when it comes to containerizing your application, making it lightweight and easy to spin up and access. Currently, it is being used across the whole organization and [is a] solution for every kind of complex problem.
  • Easy to control.
  • Setting up network across different containers is quite easy.
  • Mapping of resources with host machine is easy.
  • Setting up networking from scratch is painful.
  • Resources required for setting up Docker Enterprise are huge.
  • User interface needs to be improved and made more user friendly.
The Mirantis solution is really helpful when your critical application is containerized and you are facing any kind of problem related to containers. You don't have to rely on the community for your issue; you can raise a ticket with the vendor and the resolution is quite fast.
It [is] quite expensive when it comes to pricing, and almost all the features can be utilized using the community edition, which is free.

March 15, 2021

Save space and time!

Score 9 out of 10
Vetted Review
Verified User
Incentivized
Docker is heavily used to containerize projects and upload them to Kubernetes. It is helpful when developing microservices. Due to Docker's isolation and portability features, it is easy to deploy, run, and get a microservice up in no time. Docker is being used across the whole organization. Docker addresses the following business problems: building independent microservices, isolation, and easy portability.
  • Manage software applications easily
  • Distribute apps within the team or organization
  • Saves space
  • Security is still a concern
  • Docker is difficult to use when using different operating systems
  • Docker is an evolving technology which involves a learning curve
I would definitely recommend Docker to my colleagues if they are planning to build a microservice. The containers save not only space but also time. The ease of portability helps to pass them among the team and get the setup ready in no time. It is a great way to save developers some repetitive tasks.
Score 10 out of 10
Vetted Review
Verified User
Incentivized
Docker is everywhere; there just isn't a server or an application which is not present in Docker. It forms an integral part of the whole infra for us. The beauty of Docker comes from its amazing quality of being robust, easy to start, and very easy to blow away completely. It's the most powerful tool, which just does magic for us.
  • Robust.
  • Easy to setup.
  • The kernel cannot be changed.
For CI/CD it is the best tool to use. If you want to manage an infra where millions of machines are needed, you need to start using Docker if you are not already.
Matt James | TrustRadius Reviewer
Score 8 out of 10
Vetted Review
Verified User
Incentivized
We are currently using Docker in a test environment to deploy and monitor all of our servers/firewalls/switches/etc. throughout our company. We have a single server instance that houses all the containers and images. My department, the technology services department, is the only department that uses this, and as it is still only being tested, only one user is using/deploying/managing it—me. But it allows me to have a glance at each location to see if there are any issues that could potentially take down a site.
  • Usability is great after the initial setup.
  • Installation is a breeze.
  • The ability to knock down a container and rebuild it from scratch is fantastic.
  • It would be nice if Docker had its own frontend GUI.
  • The CLI is very difficult unless you have a decent amount of Linux experience.
  • Stacks are still a mystery to me.
It would really all depend on what they are looking to do. We are planning on using it as a monitoring tool for our locations. There are tons of different ways Docker can be used, so as I said, it depends on the use case. Not only do I use Docker for my company, but I use it at home as well, and there it is a beautiful and amazing tool for HTPC users; I just wish I had found it sooner.
July 13, 2019

Linux everywhere!

David Tanner | TrustRadius Reviewer
Score 9 out of 10
Vetted Review
Verified User
Incentivized
Docker is used by our company to build our server deployment files and to run tests. This allows us to have confidence that our deployments will work correctly in our pull request tests. Developers can also be confident that the build will run the same every time no matter where the code is being run.
  • The OSX management tool is simple to use.
  • It is nice to be able to use custom repositories.
  • The service runs mostly in the background now, and I don't have to tinker with it.
  • Sometimes issues arise running images that are only cleared by removing the cache and restarting the OSX app.
  • It is easy to build up a lot of containers that aren't being used, and you have to manually clear them up.
  • It would be nice to have a better graphical interface to see what is going on internally.
Docker gives developers flexibility and repeatable outcomes. It is very useful for developing with confidence and knowing that all environments will behave the same. Not all developers like to use Linux for developing, so being able to run a Linux instance on Windows allows team members to develop on their OS of choice.
Score 9 out of 10
Vetted Review
Verified User
Incentivized
Instead of using VMs for our testing environments in our automated pipelines, we use Docker containers to simplify and increase the efficiency of our testing. We needed a testing environment that worked both for Windows and Linux, so Docker was the best choice for our scenario. It is being used on a team-by-team basis.
  • Containers - Docker is the go-to when using Containers, which are super useful if you need an environment that works both for Windows and Linux
  • Efficiency - Docker is very lightweight and doesn't demand too much from your CPU or server
  • CI/CD - Docker is excellent for plumbing into your build pipeline. It integrates nicely, is reliable, and has an easy set up.
  • Security - Since there's no true operating system, you're pretty limited when it comes to security in Docker. But that's with all containers.
  • Not totally isolated - Docker containers all use the same kernel, so if you've got multiple Docker containers up on one server, you could run into some issues.
  • Network connectivity - There's a fine line between limiting network access and having proper communication where needed, since you don't have a full OS with Docker
Docker is great for when you would want to use a VM for any given application, but don't need the overhead of the whole OS. Docker containers use very little computing resources, boot up very quickly, and are very easy to set up. An instance where Docker may not be appropriate would be for an application that requires good security. If in this situation, a true VM would probably be your best bet.
Ben Lachman | TrustRadius Reviewer
Score 7 out of 10
Vetted Review
Verified User
Incentivized
We use Docker as part of a rapid deployment project that allows a service to be easily deployed directly onto VMs automatically during staging and production. It makes the management of the VM a parallel task to the deploy process. Traditionally the provision of a VM would be intertwined with the deploy process and containerization allows for these things to be decoupled.
  • Containerization - allowing multiple micro-services to function together without in-depth orchestration at the VM level.
  • Rapid deployment - a developer with appropriate access can simply push to the correct remote and the deploy happens automatically from there
  • Decouples provisioning from VM administration - allows containers to be deployed (more) regardless of VM set up.
  • Containers are often opaque - if a container doesn't work out of the box, it's messy to fix.
  • Logging is complexified by the multiple containers and logs are often not piped to places you expect them to be.
  • Networking is complexified due to internal port mapping between containers, etc.
Docker is great for staging and quickly deploying small to medium projects. With larger projects, it can become a significant challenge to manage all of the containers used for multiple microservices, keeping them up to date, secure and portable to other platforms. One of the goals of Docker is to allow the macro service to be platform agnostic and this can sometimes be more of a challenge than its long-term benefit.
Score 10 out of 10
Vetted Review
Verified User
Incentivized
Docker is truly an amazing tool that is used across our organization. It gives developers the tools to easily set up environments, deploy code, and build a CI pipeline. Open-sourced images and community support make it a great choice.
  • Setting up Docker containers helps developers to replicate the production environment from their local machine in a virtual box. This helps keep development and debugging simple.
  • Portability is really helpful. You can easily shift from AWS to GCP within minutes.
  • Docker images are version-controlled just like github commits.
  • User friendly - creating the virtual environment takes a lot more time than running the shell script to set up the environment.
  • Docker containers are for running applications and not for data containers. Having that feature would be awesome.
  • A prune command for Docker images and containers to force-delete all of them as a cleanup would be a welcome addition (a sketch of the prune workflow follows this review).
It is best managed with cloud providers and when setting up your CI pipeline. You would probably set up your images with access to the file system, volumes, and environment variables.
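
For context, current Docker releases do ship prune commands close to what this reviewer asks for; the following is a minimal sketch of that cleanup workflow using standard Docker CLI flags, shown as an illustration rather than as part of the review:

    # Remove all stopped containers
    docker container prune -f
    # Remove dangling (untagged) images
    docker image prune -f
    # Remove stopped containers, unused networks, dangling images, and build cache in one go
    docker system prune -f
    # Add --volumes to also delete unused volumes (destructive, use with care)
    docker system prune -f --volumes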
January 30, 2018

Quick Docker Review

Score 10 out of 10
Vetted Review
Verified User
Incentivized
Docker is used across our whole engineering organization in order to have a consistent dev environment for local testing. We also use Docker for our microservices on Rancher. Docker is extremely useful as we can easily spin up any sort of environment we want and create/test new features. The use of Docker also helps prevent those "it works on my computer" type of issues.
  • Flexibility
  • Ease of Use
  • Very powerful
  • Can be seen as a black box
  • Hard to debug if unfamiliar with it
  • Semi-steep learning curve
Docker is well suited if you want to test new technologies or just have a consistent test environment across different machines. Docker also allows you to easily share your current local environment with anyone else regardless of their system. One drawback of Docker is the need to learn some of the quirks, such as learning how to map ports and IPs to be accessible from your local machine. In cases where you don't need strict environment control and only need to do some quick tests, Docker can be overkill.
August 16, 2017

Docker FTW!

Score 10 out of 10
Vetted Review
Verified User
Incentivized
We use Docker to containerize our applications, we get many benefits from this such as:
  • consistent, realizable deployment environments across dev, QA, prod - the same image used in dev is the exact same image deployed to production
  • better utilization of server resources
  • cross cloud compatibility
  • the ease of scaling applications
  • Docker makes deployments easier across environments.
  • Docker allows better utilization of server resources by easily allowing multiple applications (images) to run on the same server.
  • Docker makes it easy to scale our applications out.
  • Docker is somewhat new and new functionality comes with each release, sometimes it can be hard to stay on top of all the new features.
  • It would be nice if a full GUI based container management system came with Docker.
Docker is best suited for deploying Linux based apps. Eventually, it should (or will) be suited for Windows based apps as well.
Score 10 out of 10
Vetted Review
Verified User
Incentivized
Docker is used by both the dev team and the QA team on my project. For the dev team it's really useful, as they had a lot of issues prior to using Docker with the different setups the devs had: Win/Linux/Mac. After switching to Docker these issues disappeared.

For me as an automation QA lead, it's mainly used for our Selenium Grid. Our grid is running on AWS, and I configured it via Docker. I use docker-compose to start it up and to scale how many browsers should be started. Using only Docker was already a huge help, as we didn't really have to worry about the configurations and it was easy to use the same setup for more instances, but combined with the scaling option of docker-compose it proved to be really convenient.
  • Develop on multiple platforms. The same Docker image can be used on Linux/Mac/Windows.
  • Ease of configuration. It's very easy to create a base image for your project. There are a lot of already existing images you can use to start with.
  • Scalability. If you need more than just one instance of the same image, it's just a command to spin up more.
  • Finding the perfect configuration: it's very easy to find some basic configurations, but fine-tuning it can be challenging.
  • Understanding the concept can be difficult at first. Most of the questions I get from colleagues are around what's happening inside Docker, how we can see the logs of what happens inside, etc. Once you have the concepts, you can easily do these things, but it can be a rough beginning.
  • Sometimes difficult to set it up. I'm mainly hearing about this from colleagues using Windows.
I most certainly would encourage everyone to try it. It might not be a good fit for their needs, but knowing about it definitely helps. For me it's very useful because of the way we can set up Selenium Grid with it. As official images are released for it, setting up a working Selenium Grid can be done in 1-2 single commands. If you use Docker Compose it's even easier to spin it up, just create a YML file describing the browsers you want to use, and with one single line you can spin up a grid with X number of different browsers and browser instances.
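
As a rough illustration of the setup described above, a Selenium Grid can be expressed as a compose file and scaled with a single command. The image tags, event-bus variables, and node count below are assumptions based on the current official selenium images, not the reviewer's exact configuration:

    # docker-compose.yml (sketch): a hub plus scalable Chrome nodes
    services:
      selenium-hub:
        image: selenium/hub
        ports:
          - "4444:4444"
      chrome:
        image: selenium/node-chrome
        depends_on:
          - selenium-hub
        environment:
          - SE_EVENT_BUS_HOST=selenium-hub
          - SE_EVENT_BUS_PUBLISH_PORT=4442
          - SE_EVENT_BUS_SUBSCRIBE_PORT=4443

    # Start the grid detached and scale the browser nodes in one line
    docker compose up -d --scale chrome=5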
Score 7 out of 10
Vetted Review
Verified User
Incentivized
A large global financial services provider based in London faced increasing regulatory pressure and market demands—led by industry disruptors offering modern, digital services. Looking to increase innovation and productivity, Barclays set out to build an Application Platform-as-a-Service as part of its cloud program. It used Red Hat OpenShift Container Platform, which incorporates Docker, along with other Red Hat solutions to update its IT infrastructure and adopt an agile, DevOps approach to application development, giving its developers on-demand, self-service capabilities. As a result, the bank improved its efficiency and agility to innovate faster and stay competitive.
  • Docker brings in an API for container management, an image format and a possibility to use a remote registry for sharing containers. This scheme benefits both developers and system administrators.
  • Docker allows for portability across machines. The application and all its dependencies can be bundled into a single container that is independent of the host version of Linux kernel, platform distribution, or deployment model. This container can be transferred to another machine that runs Docker and executed there without compatibility issues.
  • Docker has a lightweight footprint and minimal overhead. Docker images are typically very small, which facilitates rapid delivery and reduces the time to deploy new application containers.
  • Docker allows for sharing. You can use a remote repository to share your container with others.
  • Docker provides great version control and component reuse. You can track successive versions of a container, inspect differences, or roll-back to previous versions. Containers reuse components from the preceding layers, which makes them noticeably lightweight.
  • Docker has got into the bad habit of wrapping open source Linux technologies and promoting them in a way that makes it feel like Docker invented it. They did it to LXC and they are doing it to aufs and overlayfs.
  • Docker is not very developer friendly.
  • Docker containers are currently for software, not for data.
  • New Docker versions cause breakage. You get all kinds of subtle regressions between Docker versions. It's constantly breaking unpredictable stuff in unexpected ways.
  • Docker does not have a command to clean older images or handle lifecycle management.
  • Lack of kernel support.
Each Docker container's purpose is to run a single application. As such, the scope for a Docker container is built towards a particular application, as opposed to an entire operating system. The file system inside a Docker container is isolated to provide an environment similar to a VM. Docker further incorporates a container management solution that allows for easy scripting and automation. There is a strong focus on execution time for containerized applications and the ease of scripting. For developers looking for a performance comparison between a Docker container and a virtual machine, the container will win every time. That being said, some applications don't respond well to running in a container, such as those with high IO that need high-performance persisted data mounted across multiple nodes.
June 26, 2017

Testing with Docker

Score 7 out of 10
Vetted Review
Verified User
Incentivized
Docker is being used by us to create and throw away spaces as needed for testing. Instead of managing a huge hardware lab we are able to "spin up" configurations as needed. If we need a new configuration to test against we just build a new container. It makes life more simple.
  • Docker is fast.
  • Docker is well documented.
  • Docker has public container registries.
  • Docker storage is still hard.
  • Docker has poor monitoring.
  • Docker is platform-dependent.
One of the coolest things about Docker that people tend to overlook, I think, is the way it has made public repositories the go-to way to distribute and install software. I’m referring to Docker Hub, which hosts thousands of container images that anyone can grab in just a single command.
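
For example, pulling and running a public image from Docker Hub really is a one-liner; nginx here is just an arbitrary, well-known image used for illustration:

    # Download the image from Docker Hub (docker run would also pull it automatically)
    docker pull nginx
    # Run it detached, publishing container port 80 on host port 8080
    docker run -d --name web -p 8080:80 nginx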
Anudeep Palanki | TrustRadius Reviewer
Score 9 out of 10
Vetted Review
Verified User
Incentivized
Recently at Monsanto, there is a big push towards a DevOps model and micro-services. As our first step towards moving to the cloud, we started using Docker to spin up new databases for various micro-services. When moving towards micro-services, we need a simple and consistent way to spin up database instances that do not affect each other. We needed consistency because we want the instances to be the same across different environments.
  • Simple and reliable way to replicate instances.
  • Not needing to worry about internal workings of the instance as Docker takes care of managing the instance.
  • Very well documented API with large community support.
  • Verifiable Docker files, that allow us to look at what exists within a Docker file.
  • Managing backups of Docker instances does not scale well as the size of instance grows. The entire Docker instance needs to be stopped for the backups to happen and it's not always scalable.
  • While there are a lot of useful methods in the CLI, the API for the CLI is slow to evolve, leaving much to be desired. For example, executing commands on Docker instances and maintaining instances requires hacks using the CLI.
  • Writing a Docker file and debugging it is not always intuitive. Requires some trial and error to get it right.
Well suited for:

  • Small scale persistent databases.
  • Replicating the runtime environment.
  • It's also well suited for use with micro-services, where multiple small size databases need to spin up easily and consistently across environments.
It's less appropriate for database instances where backing up instances is not always scalable. It also does not fit well where monitoring instances is important; it requires a lot of additional code to manage instances.
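
A minimal sketch of the pattern this review describes, spinning up a small per-microservice database with a named volume. The postgres image, names, and password are illustrative assumptions, and the stop-the-world backup step illustrates the scaling concern mentioned above:

    # Create a named volume so the data outlives the container
    docker volume create orders-db-data

    # Run a small persistent database for one microservice
    docker run -d --name orders-db \
      -e POSTGRES_PASSWORD=example \
      -v orders-db-data:/var/lib/postgresql/data \
      -p 5432:5432 \
      postgres:15

    # Simple backup: stop the instance, archive the volume, start it again
    docker stop orders-db
    docker run --rm -v orders-db-data:/data -v "$PWD":/backup alpine \
      tar czf /backup/orders-db.tar.gz -C /data .
    docker start orders-db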
Tom Paulus | TrustRadius Reviewer
Score 8 out of 10
Vetted Review
Verified User
Incentivized
Docker allows us to provision identical instances across our various systems (testing, staging, production, etc.). Docker has also allowed us to drastically reduce our spin up time for new instances, as all of the components that we commonly use have been converted into Docker Files.

Additionally, because of the great community behind Docker, many of the components that we use (MySQL, Tomcat, etc.) already have Docker files for them, many of which are awesome, and are easily adaptable (if necessary) to best suit the needs of our department.
  • Easy to understand, with excellent documentation and community support.
  • Easy to deploy to a variety of platforms.
  • Allows containers to be quickly built, destroyed, and transferred, all while keeping them consistent.
  • Docker files can be limiting, because of the core idea of Docker, with only one process per container.
  • Debugging DockerFiles can be a nightmare.
  • Some configurations for a container cannot be updated post creation.
  • There can be some trial-and-error associated with deploying containers and their corresponding Docker files.
Docker makes it super quick and easy to deploy a new app, especially useful when you want to try out something new, without committing your whole system to it. Most Docker Images are clean and light and do not add a significant amount of overhead to a production system.
Score 8 out of 10
Vetted Review
Verified User
Incentivized
We are using Docker today to spread multiple Tomcat instances across a single machine. Docker is currently being used by our devops team but we're a small company so that's pretty much the entire infrastructure team as well. Docker helps us keep our configs simple, easy to use, and reproducible in a really efficient manner.
  • Docker makes it very easy to reproduce a service build and configuration. This is huge for rolling out quickly and efficiently.
  • Docker can orchestrate your containers to auto scale up and down with Docker Compose. This is very useful on cloud providers where you pay per instance, to keep prices down.
  • Docker's intra-container networking works well for the most part but it does leave something to be desired when attempting to weave a complex deployment of microservices across multiple bare metal machines and networks. It would be nice to introduce some sort of modeling tool into container networking.
  • A really neat feature for Docker could be to have an option to analyze container utilization and alert or notify on suggestions to improve efficiency.
Docker is well suited for any environment with a microservices architecture and a need for efficient use of hardware. It is important to not try and mold a non-conforming infrastructure into containers that run more than one service or perform multiple actions. That type of infrastructure should first be ported to microservices and then containerized.
March 24, 2017

Docker rocks!

Jesse Bye | TrustRadius Reviewer
Score 9 out of 10
Vetted Review
Verified User
Incentivized
We are just beginning to use Docker for some specialized microservices within our existing server infrastructure. Specifically, we use it to run a Selenium Grid for automated web-based testing. We are considering broader adoption of Docker within areas such as Java application deployment, local development environments, and continuous integration. Docker primarily helps us maintain environmental consistency (having the same environment from local development to deployment in the cloud).
  • abstracting the virtualization aspects so that I don't need to know every detail (even to the point of not needing to know if Docker is using a VM behind the scenes or not)
  • providing a simple yet powerful configuration scheme
  • huge selection of base containers and easy way to derive from them
  • automated builds through Docker Hub
  • multiple configuration file versions can be a little confusing
  • experienced some downtime with Docker Hub, though it was cleared up quickly
  • not really a con of Docker, but it takes some time to learn the concepts of containers and adapt to that way of thinking. Perhaps it would be helpful to have a "Docker for Old School Sys Admins" guide that helps explain some of the differences in concepts and execution when working with containers.
Docker seems to be well suited for small services, but not as much for larger monolithic applications. If your architecture lends itself well to segmenting into small, interlinked services, then Docker is an excellent candidate. However, I would be cautious about spending a lot of time re-architecting your entire platform if it is more monolithic. Docker is incredible for what it does, but it will not magically make your giant million lines of code application better. I would definitely recommend considering Docker though if you are refactoring or reworking pieces of your application. There's no reason you can't adopt it in a few places, and gradually increase adoption as it makes sense to do so.
Adam Eivy | TrustRadius Reviewer
Score 10 out of 10
Vetted Review
Verified User
Incentivized
Docker is transforming our confidence in build and release as well as developer onboarding. Docker containerization finally is fulfilling the promise that Chef never did, giving us environmental consistency across developers, build, various environments and production. We've been eliminating deployment time errors by encapsulating the entire operating system, language core components, security patches, etc., into the application build time. This has reduced the complexity of getting developers up and running. No longer do developers and operations have to understand the full workings of the dependencies within an application in order to run and deploy it--instead, we only need to know how to get Docker running and deployed to get our applications up and running. This allows us to have truly ephemeral environments and dependency management and eases autoscaling.
  • Environment consistency via full application and operating system encapsulation
  • Securing software runtime by ensuring that the whole environment is easily and quickly discarded and re-run from a known good state--as well as putting all dependencies of the operating system and patches into the built artifact
  • Easing developer setup time (up and running immediately without installing various software dependencies and configuring ports/etc.)
  • The ecosystem has many minimal base images for software but this could use more focus on secure base images
  • Many useful Docker commands are not built in as shortcuts to the CLI, but instead need to be managed as aliases (e.g. `docker rm -f $(docker ps -aq)` to remove all running and stopped containers; see the alias sketch after this review)
  • It's not always easy for people to optimize the caching layers of docker images--an auditing tool that suggests the order of Dockerfile commands for cache optimization would be handy
Honestly, sometimes I skip the use of Docker when developing Node.js apps since they encapsulate the web server component and make runtime really easy--but for deployment, I always build a Docker image--it's the only way to know that what I'm deploying is what the build server tested. Additionally, when onboarding new developers on complex services, I've found Docker to be invaluable--now we just say, "run the Docker compose" instead of "install this, then that, then configure these ports, then make sure your OS is the right version for this dependency and stop this other service with conflicting ports before you run this, etc."
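
A sketch of the kind of aliases mentioned in this review; the alias names are arbitrary, and the commands are standard Docker CLI:

    # Remove all containers, running or stopped (-q prints only IDs, -f forces removal)
    alias docker-rm-all='docker rm -f $(docker ps -aq)'
    # Remove dangling images left over from old builds
    alias docker-image-clean='docker image prune -f'
    # Bring up the whole local stack for onboarding, as described above
    alias dev-up='docker compose up -d'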
Brad Magyar | TrustRadius Reviewer
Score 8 out of 10
Vetted Review
Verified User
Incentivized
Docker is being used primarily to host a MySQL database that runs a production website. It has been very stable and easy to work with, and we like the security that containerization affords.
  • Security by isolation.
  • Ease of deployment.
  • Flexible configuration.
  • Scalability.
  • Resource management.
  • Administration simplicity.
Excellent for the fast deployment of applications or configurations from one system to another or to many distributed systems.
Claudio Fernando Maciel | TrustRadius Reviewer
Score 10 out of 10
Vetted Review
Verified User
Incentivized
We use Docker to provide us with fast containerization of our Continuous Deployment and Integration pipelines. Once our code is good for shipping, we trigger a test pipeline which will in turn compile all its dockerfiles, upload them to hub.docker.com if needed, and then upload/install an updated version of the system and its environments at DigitalOcean via Docker drivers and swarm. Our developers as well as our production servers use it, our stack being composed of a total of 4 different nodes: a MongoDB container, an elasticsearch container, a nodejs container, and our discovery service container, comprising a Consul key-value database to store all data from our slave nodes. It's solely maintained by our development team, but the system built within is widely used by our staff as well as the company's clients, spread throughout the world.
  • Its topology isolation is, in my opinion, an unbeatable feature. In our systems we need parallel Java 7 and 8 versions running together. Without Docker that would not have been possible.
  • Docker Swarm, taking care of the load-balancing characteristics our systems need, is a must-have.
  • Docker Compose is a very powerful feature: with it I can have my containers scripted and each continuous integration and deployment concern separated and isolated, whilst all being nicely bootstrapped together under the same "docker-compose up" command.
  • Some commands are not very intuitive. Getting an entire swarm properly functioning [specifically for the scenario we have at our company] wasn't a simple task, having to keep a very wide range of environment variables safely and nicely maintained and good for use. The pipeline to get such a topology ready wasn't simple to figure out.
  • Some volumes, if not properly shut down when necessary, will take up all your disk space. The extra -v attribute wasn't too obvious to use when removing a specific volume, leading us to a huge headache.
  • Some containers, though exposed as official ones at hub.docker.com, are very space- and memory-consuming. We had to figure out our own containers for pretty much everything, even though the services needed in the containers were pretty vanilla.
It's excellent for Continuous Integration and Continuous Deployment: simple, service-based containers that can be fired off with a simple script command. If you need to have your system promptly up and running, Docker is a perfect choice, even for the unskilled user, as it can be configured to run automatically via scripting by the technical staff. It provides a very elegant way of guaranteeing that all the environments are in sync throughout the company. A developer may have their own machine, but it will always match the production and staging servers.
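
As a rough sketch of the pipeline this review describes; the image name, tag, and stack name are placeholders, not the reviewer's actual configuration:

    # Build the image from the project's Dockerfile and tag it for hub.docker.com
    docker build -t myorg/myservice:1.2.3 .
    # Push it so the target environment can pull it
    docker push myorg/myservice:1.2.3
    # From a Docker Swarm manager, deploy or update the stack described in a compose file
    docker stack deploy -c docker-compose.yml mystack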
March 18, 2016

Docker in play

Score 8 out of 10
Vetted Review
Verified User
Incentivized
It's simple and easy to build and run containers on bare metal or VMs without much sysadmin experience.
  • Developers are able to set up a workstation locally in a couple of seconds.
  • docker image pull takes a long time.
  • Containers sometimes crash due to file system or daemon issues.
Docker allows us to scale horizontally and provides immutability across profiles.
March 14, 2016

Docker Rocker

Linju Jose | TrustRadius Reviewer
Score 9 out of 10
Vetted Review
Verified User
Incentivized
We use Docker in our CI builds, from creating a custom Linux image to deploying our latest code from the Docker registry.
  • Simplicity/ Efficiency
  • Isolation/ Separation of Concerns
  • Works well with cloud deployments using services like AWS
  • Supports build automation with docker registry
  • I understand Docker is evolving very well; however, I wish there were more logging support
  • A Docker dashboard that gives insights and statistics
Well suited when paired with cloud services. It helps regular automation too, but is more handy when using shell scripts for AWS build automation.

If it is a simple project or just to set up a developer environment as a local virtual machine, it might be overkill. I wish it worked straight away on Mac.